Entropy Minimization Algorithm for Multilayer Perceptrons
Authors
Abstract
We have previously proposed the use of quadratic Renyi's error entropy, estimated with a Parzen density estimator with Gaussian kernels, as an alternative optimality criterion for supervised neural network training, and showed that it produces better performance on test data compared to MSE. The error entropy criterion imposes the minimization of the average information content in the error signal rather than simply minimizing the energy, as MSE does. Recently, we developed a nonparametric entropy estimator for Renyi's definition that makes possible the use of any entropy order and any suitable kernel function in Parzen density estimation. The new estimator reduces to the previously used estimator for the special choice of Gaussian kernels and quadratic entropy. In this paper, we briefly present the new criterion and how to apply it to MLP training. We also address the issue of global optimization by the control of the kernel size in the Parzen window estimation.
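As a concrete illustration of the criterion, the sketch below estimates the quadratic Renyi error entropy of a batch of errors through a Gaussian Parzen window. It is a minimal NumPy sketch under stated assumptions: the kernel size `sigma`, the function names, and the sample data are illustrative choices, not values from the paper.

```python
import numpy as np

def gaussian_kernel(x, sigma):
    """Gaussian kernel G_sigma(x) used in the Parzen density estimate."""
    return np.exp(-x**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def quadratic_renyi_error_entropy(errors, sigma=0.5):
    """Estimate H_2(e) = -log V(e), where the information potential
    V = (1/N^2) * sum_ij G_{sigma*sqrt(2)}(e_i - e_j) follows from
    convolving two Gaussian Parzen kernels of size sigma."""
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]            # all pairwise error differences
    v = gaussian_kernel(diffs, sigma * np.sqrt(2.0)).mean()  # information potential
    return -np.log(v)

# Example: entropy of an illustrative batch of error samples
rng = np.random.default_rng(0)
errors = rng.normal(scale=0.1, size=64)
print(quadratic_renyi_error_entropy(errors, sigma=0.5))
```

Since the logarithm is monotonic, minimizing H2 amounts to maximizing the information potential V, so training can back-propagate the pairwise kernel terms through the network in place of the MSE gradient.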
Related references
Neural Networks Trained with the EEM Algorithm: Tuning the Smoothing Parameter
The training of neural networks, and particularly of multi-layer perceptrons (MLPs), is done by minimizing an error function usually known as the "cost function". In our previous works we applied the Error Entropy Minimization (EEM) algorithm, and its optimized version, to classification, using as cost function the entropy of the errors between the outputs and the desired targets of the neural network. O...
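The smoothing parameter in question is the Parzen kernel size. One simple way to tune it over training, sketched here under the assumption of an exponential decay schedule (the function name and constants are illustrative, not the authors' exact choice):

```python
def annealed_sigma(epoch, n_epochs, sigma_start=2.0, sigma_end=0.1):
    """Illustrative exponential schedule for the Parzen kernel size: a
    large sigma early on smooths the entropy cost surface (helping the
    search escape local minima), then the width decays toward sigma_end
    for a sharper density estimate near convergence."""
    ratio = sigma_end / sigma_start
    return sigma_start * ratio ** (epoch / max(1, n_epochs - 1))
```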
Training Multilayer Perceptrons Via Minimization of Sum of Ridge Functions
Motivated by the problem of training multilayer perceptrons in neural networks, we consider the problem of minimizing $E(x) = \sum_{i=1}^{n} f_i(\xi_i \cdot x)$, where $\xi_i \in \mathbb{R}^s$, $1 \le i \le n$, and each $f_i(\xi_i \cdot x)$ is a ridge function. We show that when n is small the problem of minimizing E can be treated as one of minimizing univariate functions, and we use gradient algorithms for minimizing E when n is moderately la...
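A minimal gradient-descent sketch for this objective, assuming the derivatives f_i' are supplied by the caller; the quadratic instance below is illustrative, and the paper's own algorithms exploit more structure than plain descent:

```python
import numpy as np

def grad_E(x, xis, f_primes):
    """Gradient of E(x) = sum_i f_i(xi_i . x); by the chain rule each
    ridge term contributes f_i'(xi_i . x) * xi_i."""
    return sum(fp(xi @ x) * xi for xi, fp in zip(xis, f_primes))

def minimize_E(x0, xis, f_primes, lr=0.01, steps=2000):
    """Plain gradient descent on the sum of ridge functions."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        x -= lr * grad_E(x, xis, f_primes)
    return x

# Illustrative instance: f_i(t) = (t - b_i)^2, i.e. a least-squares problem.
xis = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
f_primes = [lambda t, b=b: 2.0 * (t - b) for b in (1.0, 2.0, 3.0)]
print(minimize_E(np.zeros(2), xis, f_primes))   # converges to (1, 2)
```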
Neural network classification using Shannon's entropy
Recent years have witnessed increasing attention to entropy-based criteria in adaptive systems. Several principles have been proposed based on the maximization or minimization of entropic cost functions. We propose a new type of neural network classifier with a multilayer perceptron (MLP) architecture, but where the usual mean square error minimization principle is substituted by the minimizatio...
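A plug-in (resubstitution) estimate of Shannon's entropy of the error samples, again built on a Gaussian Parzen density, could look like the sketch below; the smoothing parameter `sigma` and the function name are illustrative assumptions:

```python
import numpy as np

def shannon_error_entropy(errors, sigma=0.5):
    """Resubstitution estimate of Shannon's entropy of the errors:
    H = -(1/N) * sum_i log p_hat(e_i), where p_hat is a Gaussian
    Parzen density built from the same error samples."""
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]
    kern = np.exp(-diffs**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)
    p_hat = kern.mean(axis=1)          # density estimate at each error sample
    return -np.log(p_hat).mean()
```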
Local linear perceptrons for classification
A structure composed of local linear perceptrons for approximating global class discriminants is investigated. Such local linear models may be combined in a cooperative or competitive way. In the cooperative model, a weighted sum of the outputs of the local perceptrons is computed where the weight is a function of the distance between the input and the position of the local perceptron. In the c...
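One way to realize the cooperative combination is sketched below with normalized Gaussian distance weights; the weighting function is an assumption on my part, since the abstract only says the weight is a function of the distance to the local perceptron's position:

```python
import numpy as np

def cooperative_output(x, centers, Ws, bs, bandwidth=1.0):
    """Cooperative combination of local linear perceptrons: each expert k
    computes W_k x + b_k, and the final output is a weighted sum whose
    weights decay (here as a normalized Gaussian, an illustrative choice)
    with the distance between the input x and the expert's center c_k."""
    d2 = np.array([np.sum((x - c) ** 2) for c in centers])
    g = np.exp(-d2 / (2.0 * bandwidth ** 2))
    g /= g.sum()                        # normalize the distance-based weights
    outs = np.array([W @ x + b for W, b in zip(Ws, bs)])
    return g @ outs                     # weighted sum of the local outputs
```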
Improving Error Back Propagation Algorithm by using Cross Entropy Error Function and Adaptive Learning Rate
Improving the efficiency and convergence rate of multilayer backpropagation neural network algorithms is an important area of research. Recent research has paid increasing attention to entropy-based criteria in adaptive systems. Several principles have been proposed based on the maximization or minimization of a cross-entropy function. One way of entropy criteria in learning systems i...
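A compact sketch of the two ingredients named in the title: the cross-entropy cost for a two-class network with sigmoid outputs, and one common adaptive learning-rate rule (the bold-driver scheme, used here as an illustrative stand-in rather than the paper's specific rule):

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy cost for sigmoid outputs; eps guards log(0)."""
    p = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(p) + (1.0 - y_true) * np.log(1.0 - p))

def adapt_learning_rate(lr, prev_cost, cost, grow=1.05, shrink=0.5):
    """Bold-driver style adaptation: enlarge the step after an improving
    epoch, cut it sharply after a worsening one."""
    return lr * grow if cost < prev_cost else lr * shrink
```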
Publication year: 2001